Boosting by weighting critical and erroneous samples
Authors
Abstract
Real AdaBoost is a well-known boosting method with good performance that is used to build machine ensembles for classification. Its emphasis function can be decomposed into two factors that pay separate attention to sample errors and to the samples' proximity to the classification border. Building on this, a generalized emphasis function that combines both components by means of a selectable parameter, λ, is presented. Experiments show that simple methods of selecting λ frequently offer better performance and smaller ensembles.
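The abstract does not spell out the functional form, but for labels y in {-1, +1} the identity -y·f = ((f - y)² - f² - 1)/2 factors the Real AdaBoost weight exp(-y·f) into an error term, exp((f - y)²/2), and a proximity term, exp(-f²/2). The NumPy sketch below blends the two exponents with λ; the exact parameterization (λ and 1 - λ on the two exponents, with λ = 0.5 recovering Real AdaBoost up to normalization) is an assumption for illustration, not a verbatim statement of the paper's formula.

```python
import numpy as np

def generalized_emphasis(f, y, lam=0.5):
    """Blended emphasis sketch for labels y in {-1, +1}.

    exp(lam * (f - y)**2 - (1 - lam) * f**2):
    lam = 1 weights only the quadratic error of each sample,
    lam = 0 weights only proximity to the border (f near 0),
    lam = 0.5 recovers Real AdaBoost's exp(-y * f) up to a
    constant factor absorbed by the normalization.
    """
    f = np.asarray(f, dtype=float)
    y = np.asarray(y, dtype=float)
    w = np.exp(lam * (f - y) ** 2 - (1.0 - lam) * f ** 2)
    return w / w.sum()  # normalize to a sampling distribution

# Toy check: a confident-correct, a near-border, and an erroneous sample.
f = np.array([0.9, -0.1, -0.8])   # current ensemble outputs
y = np.array([1.0, 1.0, 1.0])
for lam in (0.0, 0.5, 1.0):
    print(lam, generalized_emphasis(f, y, lam).round(3))
# lam = 0 puts most weight on the near-border sample,
# lam = 1 on the erroneous one.
```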
Similar resources
Boosting by weighting boundary and erroneous samples
This paper shows that new and flexible criteria for resampling populations in boosting algorithms can lead to performance improvements. The Real AdaBoost emphasis function can be divided into two terms: the first pays attention only to the quadratic error of each pattern, while the second takes into account only the "proximity" of each pattern to the boundary. Here, we incorporate an additional...
Boosting in Probabilistic Neural Networks
The basic idea of boosting is to increase the pattern recognition accuracy by combining classifiers which have been derived from differently weighted versions of the original training data. It has been verified in practical experiments that the resulting classification performance can be improved by increasing the weights of misclassified training samples. However, in statistical pattern recogn...
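The re-weighting this snippet refers to is the standard discrete AdaBoost update (Freund & Schapire); a self-contained toy example follows, with the data made up for illustration:

```python
import numpy as np

def adaboost_reweight(w, y, pred):
    """One discrete AdaBoost re-weighting step.

    Misclassified samples get their weight multiplied by a factor
    greater than 1, correctly classified ones by a factor smaller
    than 1, so the next classifier concentrates on the hard cases.
    Assumes 0 < weighted error < 1.
    """
    miss = (pred != y)
    err = np.sum(w[miss]) / np.sum(w)       # weighted error rate
    alpha = 0.5 * np.log((1 - err) / err)   # classifier vote weight
    w = w * np.exp(alpha * np.where(miss, 1.0, -1.0))
    return w / w.sum(), alpha

# Toy example: 4 samples, the last one misclassified.
w = np.full(4, 0.25)
y = np.array([1, 1, -1, -1])
pred = np.array([1, 1, -1, 1])
w, alpha = adaboost_reweight(w, y, pred)
print(w, alpha)   # the misclassified sample now carries the most weight
```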
Pattern Classification Using Support Vector Machine Ensemble
While the support vector machine (SVM) can provide good generalization performance, the classification result of the SVM in practical implementations is often far from the theoretically expected level, because these implementations rely on approximate algorithms to cope with the high time and space complexity. To improve the limited classification performance of the real SVM, we propose to use an SVM ensemble...
Outlier Detection by Boosting Regression Trees
A procedure for detecting outliers in regression problems is proposed. It is based on information provided by boosting regression trees. The key idea is to select the most frequently resampled observation across the boosting iterations and to reiterate after removing it. The selection criterion is based on Tchebychev's inequality applied to the maximum, over the boosting iterations, of ...
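As a rough illustration of the counting step described above, the sketch below draws a bootstrap sample per boosting iteration from given sampling weights and returns the most frequently resampled observation. The Tchebychev-based test and the actual tree boosting are omitted, and all names and data here are made up:

```python
import numpy as np

rng = np.random.default_rng(0)

def most_resampled_index(weights_per_iter):
    """Count how often each observation is drawn when a bootstrap
    sample is taken at every boosting iteration according to that
    iteration's sampling weights, and return the index picked most
    often. In the cited procedure this candidate would then be
    tested via Tchebychev's inequality and, if flagged as an
    outlier, removed before boosting is rerun; the test itself is
    omitted in this sketch.
    """
    n = weights_per_iter.shape[1]
    counts = np.zeros(n, dtype=int)
    for w in weights_per_iter:
        sample = rng.choice(n, size=n, replace=True, p=w / w.sum())
        counts += np.bincount(sample, minlength=n)
    return int(np.argmax(counts)), counts

# Toy run: 20 observations, 50 boosting iterations; observation 7
# is given persistently high weight, as an outlier would be.
W = rng.uniform(0.5, 1.5, size=(50, 20))
W[:, 7] *= 10.0
idx, counts = most_resampled_index(W)
print(idx)   # expected: 7
```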
Designing neural network committees by combining boosting ensembles
The use of modified Real AdaBoost ensembles, obtained by applying weighted emphasis on erroneous and critical (near the classification boundary) samples, has been shown to lead to improved designs, both in performance and in ensemble size. In this paper, we propose to take advantage of the diversity among different weighted combinations to build committees of modified Real AdaBoost designs. Experiments show that ...
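A committee of such designs can be fused in many ways; the sketch below assumes simple averaging of the members' real-valued outputs, an illustrative choice rather than the paper's actual combination rule:

```python
import numpy as np

def committee_predict(ensemble_outputs):
    """Fuse the real-valued outputs of several boosted ensembles
    (e.g., trained with different values of the emphasis parameter
    lambda) by averaging, then take the sign as the committee's
    class decision.
    """
    fused = np.mean(ensemble_outputs, axis=0)
    return np.sign(fused)

# Toy example: outputs of three ensembles on four test samples.
outs = np.array([[ 0.8, -0.2,  0.1, -0.9],
                 [ 0.6,  0.3, -0.2, -0.7],
                 [ 0.9, -0.4,  0.4, -0.8]])
print(committee_predict(outs))   # [ 1. -1.  1. -1.]
```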
Journal: Neurocomputing
Volume: 69
Issue: -
Pages: -
Publication date: 2006